Fast Neural Networks with Circulant Projections
Authors
Abstract
The basic computation of a fully-connected neural network layer is a linear projection of the input signal followed by a non-linear transformation. The linear projection step consumes the bulk of the processing time and memory footprint. In this work, we propose to replace the conventional linear projection with the circulant projection. The circulant structure enables the use of the Fast Fourier Transform to speed up the computation. Considering a neural network layer with d input nodes and d output nodes, this method improves the time complexity from O(d²) to O(d log d) and the space complexity from O(d²) to O(d). We further show that the gradient computation and optimization of the circulant projections can be performed very efficiently. Our experiments on three standard datasets show that the proposed approach achieves this significant gain in efficiency and storage with minimal loss of accuracy compared to neural networks with unstructured projections.
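The speed-up rests on the circular convolution theorem: multiplying by a circulant matrix R = circ(r) is equivalent to a circular convolution with r, which can be evaluated with FFTs in O(d log d) instead of the O(d²) dense matrix-vector product. A minimal NumPy sketch (not code from the paper; variable names are illustrative) comparing the two:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8
r = rng.standard_normal(d)  # parameter vector defining the circulant matrix
x = rng.standard_normal(d)  # layer input

# Direct O(d^2) route: build the explicit circulant matrix whose
# k-th column is r circularly shifted down by k, then multiply.
R = np.stack([np.roll(r, k) for k in range(d)], axis=1)
y_direct = R @ x

# FFT route, O(d log d): circulant multiplication is circular
# convolution, i.e. an elementwise product in the Fourier domain.
y_fft = np.fft.ifft(np.fft.fft(r) * np.fft.fft(x)).real

print(np.allclose(y_direct, y_fft))  # the two projections agree
```

In a real layer the d-point FFT of r can be cached between inputs, so each forward pass costs one FFT of x, one elementwise product, and one inverse FFT, and the layer stores only the d-dimensional vector r rather than a d×d weight matrix.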
Similar resources
Memory Capacity of Neural Networks using a Circulant Weight Matrix
This paper presents results on the memory capacity of a generalized feedback neural network using a circulant matrix. Children are capable of learning soon after birth which indicates that the neural networks of the brain have prior learnt capacity that is a consequence of the regular structures in the brain’s organization. Motivated by this idea, we consider the capacity of circulant matrices ...
Circulant Binary Embedding
Binary embedding of high-dimensional data requires long codes to preserve the discriminative power of the input space. Traditional binary coding methods often suffer from very high computation and storage costs in such a scenario. To address this problem, we propose Circulant Binary Embedding (CBE) which generates binary codes by projecting the data with a circulant matrix. The circulant struct...
Fast Binary Embedding for High-Dimensional Data
Binary embedding of high-dimensional data requires long codes to preserve the discriminative power of the input space. Traditional binary coding methods often suffer from very high computation and storage costs in such a scenario. To address this problem, we propose two solutions which improve over existing approaches. The first method, Bilinear Binary Embedding (BBE), converts high-dimensional ...
Symmetric Functional Differential Equations and Neural Networks with Memory
We establish an analytic local Hopf bifurcation theorem and a topological global Hopf bifurcation theorem to detect the existence and to describe the spatial-temporal pattern, the asymptotic form and the global continuation of bifurcations of periodic wave solutions for functional differential equations in the presence of symmetry. We apply these general results to obtain the coexistence of mul...
Efficient Recurrent Neural Networks using Structured Matrices in FPGAs
Recurrent Neural Networks (RNNs) are becoming increasingly important for time series-related applications which require efficient and real-time implementations. The recent pruning based work ESE (Han et al., 2017) suffers from degradation of performance/energy efficiency due to the irregular network structure after pruning. We propose block-circulant matrices for weight matrix representation in...
Journal: CoRR
Volume: abs/1502.03436
Issue: -
Pages: -
Publication date: 2015